
    Optimal Scheduling Policy Determination for High Speed Downlink Packet Access

    Abstract — In this paper, we present an analytic model and methodology to determine the optimal scheduling policy for the two-dimensional resource allocation, in time and code, of a High Speed Downlink Packet Access (HSDPA) system. A discrete stochastic dynamic programming model for the HSDPA downlink scheduler is presented, and value iteration is used to solve for the optimal policy. This framework is applied to find the optimal scheduling policy for the case of two users sharing the same cell. Simulation is used to study the performance of the resulting optimal policy, with a Round Robin (RR) scheduler as a baseline. Policy granularity is introduced to reduce the computational complexity by shrinking the action space. The results show that finer granularity (down to 5 codes) improves performance significantly, while the gain from even finer granularity is marginal and does not justify the added complexity. The behaviour of the value function is examined to characterize the optimal scheduling policy, and these observations are then used to develop a heuristic scheduling policy. According to the simulation results, the devised heuristic policy has much lower computational complexity, which makes it easy to deploy, with only a slight reduction in performance compared to the optimal policy.
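    A minimal sketch of the value-iteration step described above, for a generic finite-state scheduler MDP. The state space, code-split actions, transition tensor, and rewards are illustrative placeholders, not the HSDPA model or parameters from the paper.

```python
# Minimal value-iteration sketch for a finite MDP scheduler.
# The states, actions (coarse code splits), transition probabilities
# and rewards below are illustrative placeholders, not the paper's
# HSDPA downlink model.
import numpy as np

def value_iteration(P, R, gamma=0.95, tol=1e-6):
    """P: (A, S, S) transition tensor, R: (A, S) expected rewards."""
    A, S, _ = P.shape
    V = np.zeros(S)
    while True:
        # Q[a, s] = R[a, s] + gamma * sum_s' P[a, s, s'] * V[s']
        Q = R + gamma * (P @ V)
        V_new = Q.max(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=0)   # value function, greedy policy
        V = V_new

# Toy example: 4 channel states, 3 coarse code-split actions
# (e.g. give 0, 5 or 10 of the shared codes to user 1).
rng = np.random.default_rng(0)
P = rng.dirichlet(np.ones(4), size=(3, 4))   # random stochastic matrices
R = rng.uniform(0, 1, size=(3, 4))           # placeholder throughput rewards
V, policy = value_iteration(P, R)
print("optimal action per state:", policy)
```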

    On the Reliability of LTE Random Access: Performance Bounds for Machine-to-Machine Burst Resolution Time

    The Random Access Channel (RACH) has been identified as one of the major bottlenecks for accommodating a massive number of machine-to-machine (M2M) users in LTE networks, especially in the case of burst arrivals of connection requests. As a consequence, the burst resolution problem has sparked a large number of works in the area, analyzing and optimizing the average performance of RACH. However, an understanding of the probabilistic performance limits of RACH is still missing. To address this limitation, in this paper we investigate the reliability of RACH with access class barring (ACB). We model RACH as a queuing system and apply stochastic network calculus to derive probabilistic performance bounds for the burst resolution time, i.e., the worst-case time it takes to connect a burst of M2M devices to the base station. We illustrate the accuracy of the proposed methodology and its potential applications in performance assessment and system dimensioning. Comment: Presented at IEEE International Conference on Communications (ICC), 201
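    As a rough illustration of the burst resolution setting, the sketch below simulates a burst of M2M devices contending on a slotted random-access channel with an ACB check and reports empirical percentiles of the resolution time. It is a Monte Carlo stand-in, not the stochastic network calculus bound derived in the paper; the preamble count, ACB factor, and burst size are assumed values.

```python
# Monte Carlo sketch of a burst of M2M devices resolving over a
# slotted random-access channel with an access class barring (ACB)
# factor. The parameter values are illustrative assumptions, not the
# configuration analysed in the paper.
import random

def burst_resolution_time(n_devices=500, n_preambles=54, acb=0.5,
                          rng=random.Random(1)):
    backlog = n_devices
    t = 0
    while backlog > 0:
        t += 1
        # Each backlogged device passes the ACB check with prob. `acb`,
        # then picks one of the preambles uniformly at random.
        choices = {}
        for _ in range(backlog):
            if rng.random() < acb:
                p = rng.randrange(n_preambles)
                choices[p] = choices.get(p, 0) + 1
        # A preamble chosen by exactly one device is a success.
        backlog -= sum(1 for c in choices.values() if c == 1)
    return t  # slots until the whole burst is connected

samples = sorted(burst_resolution_time() for _ in range(200))
print("median slots:", samples[len(samples) // 2],
      "| ~99th percentile:", samples[int(0.99 * len(samples)) - 1])
```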

    Statistical Delay Bound for WirelessHART Networks

    In this paper we provide a performance analysis framework for wireless industrial networks by deriving a service curve and a bound on the delay violation probability. For this purpose we use the (min,x) stochastic network calculus as well as a recently presented recursive formula for an end-to-end delay bound for wireless heterogeneous networks. The derived results are mapped to WirelessHART networks used in process automation and validated via simulations. In addition to WirelessHART, our results can be applied to any wireless network whose physical layer conforms to the IEEE 802.15.4 standard and whose MAC protocol incorporates TDMA and channel hopping, e.g., ISA100.11a or TSCH-based networks. The provided delay analysis is especially useful during the network design phase, offering further research potential towards optimal routing and power management in QoS-constrained wireless industrial networks. Comment: Accepted at PE-WASUN 201
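    In the spirit of the simulation-based validation mentioned above, the sketch below estimates the delay violation probability for a packet crossing a short TDMA line where each hop retransmits in its own slot until the link succeeds. The hop count, per-slot loss probability, and delay budget are illustrative assumptions, not WirelessHART parameters or the (min,x) calculus bound from the paper.

```python
# Simulation-style sketch of a delay violation probability on a short
# multi-hop TDMA line. All parameters are illustrative assumptions.
import random

def end_to_end_delay_slots(n_hops=3, p_loss=0.1, rng=random.Random(7)):
    """Slots needed to push one packet over n_hops unreliable TDMA links."""
    slots = 0
    for _ in range(n_hops):
        # Retransmit in the hop's dedicated slot until the link succeeds.
        while True:
            slots += 1
            if rng.random() > p_loss:
                break
    return slots

target = 8                      # delay budget in slots
n_runs = 100_000
violations = sum(end_to_end_delay_slots() > target for _ in range(n_runs))
print(f"estimated P(delay > {target} slots) = {violations / n_runs:.4f}")
```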

    A Network Calculus Approach for the Analysis of Multi-Hop Fading Channels

    A fundamental problem in the delay and backlog analysis across multi-hop paths in wireless networks is how to account for the random properties of the wireless channel. Since the usual statistical models for radio signals in a propagation environment do not lend themselves easily to a description of the available service rate on a wireless link, the performance analysis of wireless networks has resorted to higher-layer abstractions, e.g., using Markov chain models. In this work, we propose a network calculus that can incorporate common statistical models of fading channels and obtain statistical bounds on delay and backlog across multiple nodes. We conduct the analysis in a transfer domain, which we refer to as the `SNR domain', where the service process at a link is characterized by the instantaneous signal-to-noise ratio at the receiver. We discover that, in the transfer domain, the network model is governed by a dioid algebra, which we refer to as (min,x)-algebra. Using this algebra we derive the desired delay and backlog bounds. An application of the analysis is demonstrated for a simple multi-hop network with Rayleigh fading channels and for a network with cross traffic. Comment: 26 page
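    The sketch below sets up the kind of scenario analysed above: a short tandem of links whose per-slot service is the Shannon rate of an exponentially distributed (Rayleigh-power) SNR, with constant arrivals, tracking the end-to-end backlog empirically. It is a discrete-time simulation of the setting, not the (min,x) SNR-domain bound from the paper; the mean SNR, arrival rate, and hop count are assumed values.

```python
# Discrete-time fluid simulation of a short tandem of Rayleigh-fading
# links: constant arrivals, per-slot Shannon service with exponentially
# distributed SNR. Parameters are illustrative assumptions.
import math
import random

def simulate_backlog(n_hops=2, mean_snr=10.0, arrival=1.0, n_slots=50_000,
                     rng=random.Random(3)):
    buffers = [0.0] * n_hops
    max_total = 0.0
    for _ in range(n_slots):
        incoming = arrival                          # bits entering hop 0 this slot
        for h in range(n_hops):
            buffers[h] += incoming
            snr = rng.expovariate(1.0 / mean_snr)   # Rayleigh fading -> exp. SNR
            service = math.log2(1.0 + snr)          # normalised Shannon rate
            out = min(buffers[h], service)
            buffers[h] -= out
            incoming = out                          # forwarded to the next hop
        max_total = max(max_total, sum(buffers))
    return max_total

print("max end-to-end backlog over the run:", round(simulate_backlog(), 2))
```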